Deep factorisation of the stable process
Author
Abstract
The Lamperti–Kiu transformation for real-valued self-similar Markov processes (rssMp) states that, associated to each rssMp via a space-time transformation, is a Markov additive process (MAP). In the case that the rssMp is taken to be an α-stable process with α ∈ (0, 2), [12] and [20] have computed explicitly the matrix exponent of the semi-group of the embedded MAP, which we henceforth refer to as the Lamperti-stable MAP. Specifically, the matrix exponent of the Lamperti-stable MAP's transition semi-group can be written in a compact form using only gamma functions. Just as with Lévy processes, there exists a factorisation of the (matrix) exponents of MAPs, with each of the two factors uniquely characterising the ascending and descending ladder processes, which themselves are again MAPs. To the author's knowledge, not a single example of such a factorisation currently exists in the literature. In this article we provide a completely explicit Wiener–Hopf factorisation for the Lamperti-stable MAP. As a consequence of our methodology, we also obtain new results concerning the space-time invariance properties of stable processes, and accordingly we develop some new fluctuation identities therewith.
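For orientation, the compact gamma-function form alluded to in the abstract can be sketched as follows. This is a sketch of the form reported in the literature cited above, not a statement from this abstract itself: writing ρ = P(X₁ ≥ 0) for the positivity parameter of the stable process and ρ̂ = 1 − ρ, the matrix exponent of the Lamperti-stable MAP reads, up to sign and normalisation conventions that vary between sources,

```latex
F(z) \;=\;
\begin{pmatrix}
  -\dfrac{\Gamma(\alpha - z)\,\Gamma(1 + z)}{\Gamma(\alpha\hat\rho - z)\,\Gamma(1 - \alpha\hat\rho + z)}
  & \dfrac{\Gamma(\alpha - z)\,\Gamma(1 + z)}{\Gamma(\alpha\hat\rho)\,\Gamma(1 - \alpha\hat\rho)} \\[2.5ex]
  \dfrac{\Gamma(\alpha - z)\,\Gamma(1 + z)}{\Gamma(\alpha\rho)\,\Gamma(1 - \alpha\rho)}
  & -\dfrac{\Gamma(\alpha - z)\,\Gamma(1 + z)}{\Gamma(\alpha\rho - z)\,\Gamma(1 - \alpha\rho + z)}
\end{pmatrix},
\qquad \operatorname{Re}(z) \in (-1, \alpha).
```

The off-diagonal entries govern the jumps of the modulating chain (sign changes of the stable process), while the diagonal entries are the Laplace exponents of the ordinates between sign changes; the Wiener–Hopf factorisation referred to in the abstract splits F into two such matrix factors, one per ladder direction.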
Similar resources
Ultrasonographic Comparison of Deep Lumbopelvic Muscles Activity in Plank Movements on Stable and Unstable Surface
Purpose: The body core facilitates the transition of forces and moments between the upper and lower extremities in every movement. The present study investigated the differences in the sonographic activity of the deep lumbopelvic muscles during the implementation of plank movements on stable and unstable surfaces. Methods: In total, 16 female athletes with the Mean±SD age of 23.69±3.57 years, ...
Deep Multi-task Representation Learning: A Tensor Factorisation Approach
Most contemporary multi-task learning methods assume linear models. This setting is considered shallow in the era of deep learning. In this paper, we present a new deep multi-task representation learning framework that learns cross-task sharing structure at every layer in a deep network. Our approach is based on generalising the matrix factorisation techniques explicitly or implicitly used by m...
The Factorisation Forest Theorem
This chapter is devoted to the presentation of the factorisation forest theorem, a deep result due to Simon, which provides advanced Ramsey-like arguments in the context of algebra, automata, and logic. We present several proofs and several variants of the result, as well as applications.
A New Global Analysis of Deep Inelastic Scattering Data
A new QCD analysis of Deep Inelastic Scattering (DIS) data is presented. All available neutrino and anti-neutrino cross sections are reanalysed and included in the fit, along with charged-lepton DIS and Drell-Yan data. A massive factorisation scheme is used to describe the charm component of the structure functions. Next-to-leading order parton distribution functions are provided. In particular...
Elastic Meson Production – Factorisation and Gauge Invariance
The factorisation of the hard amplitude for exclusive meson production in deep inelastic scattering is considered in the framework of a simple model. It is demonstrated explicitly how gauge invariance ensures the cancellation of non-factorising contributions.
Journal:
Volume, Issue:
Pages: -
Publication date: 2014